    Editor's Note

    Artificial Intelligence has become one of the most relevant technologies of our time, driving us towards a new revolution, a change in society, just as other human inventions, such as navigation, the steam engine, or electricity, did in the past. There are several ways in which AI might be developed, and the European Union has chosen a path through this revolution in which Artificial Intelligence will be a tool at the service of Humanity. That was precisely the motto of the 2020 European Conference on Artificial Intelligence (“Paving the way towards Human-Centric AI”), and this special issue is a selection of the best papers, chosen by the organizers of some of the workshops at ECAI 2020.

    Modelling Measurement Processes as Timed Information Processes in Simplex Domains

    This paper presents a domain-theoretic model for measurements and measuring instruments, making explicit in simplex-domain structures two important aspects of measurement processes: the standard representation relation, established between the (physical) values being measured and the meanings of the readings (semantic values) of the measuring instruments used to measure them, and the time underlying every measurement process, in a way that makes it possible to trace the history of every measuring process. We also present the modelling of measurements performed by combined measuring instruments synchronized in time. Finally, the domain-theoretic modelling of a sample measuring process is presented to illustrate the approach.
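
    A minimal sketch of the refinement idea behind this model, using the interval domain ordered by reverse inclusion in place of the paper's simplex-domain structures; the names Reading, Info, and standard_representation are illustrative, not from the paper.

```python
# A hedged sketch: a measurement process as a time-indexed sequence of
# refining information states.  Here the "domain" is the interval domain
# ordered by reverse inclusion; the paper's simplex-domain machinery is
# not reproduced.

from typing import List, Tuple

Reading = int               # raw instrument reading (e.g. a display count)
Info = Tuple[float, float]  # interval of physical values (lo, hi)

def standard_representation(reading: Reading, resolution: float) -> Info:
    # The "standard representation relation": each reading denotes the
    # set of physical values it may stand for (here: +/- half resolution).
    value = reading * resolution
    return (value - resolution / 2, value + resolution / 2)

def refines(later: Info, earlier: Info) -> bool:
    # Information order: a later state carries at least as much
    # information, i.e. the later interval is contained in the earlier one.
    return earlier[0] <= later[0] and later[1] <= earlier[1]

# A timed measurement process: readings at t = 0, 1, 2 with improving resolution.
process: List[Info] = [
    standard_representation(12, 0.1),     # t = 0: coarse instrument
    standard_representation(123, 0.01),   # t = 1: finer instrument
    standard_representation(1234, 0.001), # t = 2: finest instrument
]

# The history of the process is the chain of refining information states.
assert all(refines(process[t + 1], process[t]) for t in range(len(process) - 1))
```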

    From Intervals to? Towards a General Description of Validated Uncertainty

    In many real-life situations, we are interested in physical quantities that are difficult or even impossible to measure directly. To estimate the value of such a quantity y, we measure the values of auxiliary quantities x1,...,xn that are related to y by a known functional relation y=f(x1,...,xn), and we then use the results Xi of measuring xi to find the desired estimate Y=f(X1,...,Xn). Due to measurement errors, the measured values Xi are slightly different from the actual (unknown) values xi; as a result, our estimate Y is different from the actual value y=f(x1,...,xn) of the desired quantity. When xi and y are numbers, the measurement accuracy can usually be represented in interval terms, and interval computations can be used to estimate the resulting uncertainty in y. In some real-life problems, what we are interested in is more complex than a number. For example, we may be interested in the dependence of one physical quantity x1 on another quantity x2: we may be interested in how the material strain depends on the applied stress, or in how the temperature depends on a point in 3-D space; in all such cases, what we are interested in is a function. We may be interested in even more complex structures: e.g., in quantum mechanics, measuring instruments are described by operators in a Hilbert space, so if we want to have a precise description of an actual (imperfect) measuring instrument, what we are interested in is an operator. For many such mathematical structures, researchers have developed ways to represent uncertainty, but usually, for each new structure, we have to perform a lot of complex analysis from scratch. It is desirable to come up with a general methodology that would automatically produce a natural description of validated uncertainty for all physically interesting situations (or at least for as many such situations as possible). In this paper, we lay the foundations for such a methodology; it turns out that this problem naturally leads to the technique of domains, first introduced by D. Scott in the 1970s.
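
    The interval-computation step described above can be sketched as follows; the Interval class and the example relation f are illustrative, not from the paper.

```python
# Naive interval arithmetic: propagate measurement uncertainty through
# y = f(x1, x2) by enclosing the range of each arithmetic operation.

from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other: "Interval") -> "Interval":
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

def f(x1: Interval, x2: Interval) -> Interval:
    # Example functional relation y = x1 * x2 + x2 (illustrative only)
    return x1 * x2 + x2

# Measured values with +/- 0.05 measurement error
x1 = Interval(1.95, 2.05)   # X1 = 2.0 +/- 0.05
x2 = Interval(2.95, 3.05)   # X2 = 3.0 +/- 0.05
y = f(x1, x2)
print(f"y is guaranteed to lie in [{y.lo:.4f}, {y.hi:.4f}]")
```

    Because x2 occurs twice in f, the resulting enclosure is guaranteed but may be wider than the exact range; controlling this overestimation is one of the central concerns of interval computations.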

    On the Relation Inclusion of Zonotopes in the Plane

    The theory of quasivector spaces has been applied to a class of zonotopes in the plane defined as positive combinations of basic centered segments. This leads to an implicit presentation of the zonotopes by means of two vectors: one for the center and one for the centered zonotope obtained by translating the original zonotope to the origin. Using this presentation, the inclusion order relation on zonotopes in the plane has been studied. More specifically, sufficient conditions for inclusion have been stated in terms of the implicit presentation.
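
    As an illustration of deciding inclusion from a generator-based presentation, the sketch below tests the support-function inequality at the facet normals of the outer planar zonotope; this is a standard geometric test, not the quasivector-space condition derived in the paper.

```python
# Planar zonotope inclusion via support functions: A is contained in B
# iff h_A(n) <= h_B(n) at every facet normal n of B; for a planar
# zonotope, the facet normals are the +/- perpendiculars of its generators.

from typing import List, Tuple

Vec = Tuple[float, float]
Zonotope = Tuple[Vec, List[Vec]]   # (center, generators)

def support(z: Zonotope, d: Vec) -> float:
    # Support function of Z = c + sum_i [-1, 1] * g_i in direction d:
    # h_Z(d) = <c, d> + sum_i |<g_i, d>|
    (cx, cy), gens = z
    return cx * d[0] + cy * d[1] + sum(abs(gx * d[0] + gy * d[1])
                                       for gx, gy in gens)

def included(a: Zonotope, b: Zonotope) -> bool:
    # Check the support inequality at each facet normal of b.
    _, gens_b = b
    normals = [(-gy, gx) for gx, gy in gens_b] + [(gy, -gx) for gx, gy in gens_b]
    return all(support(a, n) <= support(b, n) + 1e-12 for n in normals)

# A small, slightly shifted square is included in a larger centered square.
small: Zonotope = ((0.1, 0.0), [(0.5, 0.0), (0.0, 0.5)])
large: Zonotope = ((0.0, 0.0), [(1.0, 0.0), (0.0, 1.0)])
print(included(small, large))   # True
print(included(large, small))   # False
```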

    The Interval Categorizer Tessellation-Based Model for High Performance Computing

    The paper presents the results obtained by an implementation of the interval tessellation-based model for the categorization of geographic regions according to the analysis of the declivity of the relief function, called ICTM. The analysis of the relief declivity, which is embedded in the rules of the ICTM model, categorizes each tessellation cell, with respect to the whole considered region, according to the sign (positive, negative, null) of the declivity of the cell. Such information is represented in the states assumed by the cells of the model. The overall configuration of such cells allows the division of the region into sub-regions of cells belonging to the same category, that is, presenting the same declivity sign. In order to control the errors coming from the discretization of the region into tessellation cells, or resulting from numerical computations, interval techniques are used.
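
    A hedged sketch of the categorization step: the sign of the declivity between neighbouring cells is decided from an interval enclosure of the slope, so that discretization and measurement errors cannot produce a wrong sign; the function name and the handling of undecided cells are illustrative, not the exact ICTM rules.

```python
# Categorize cells by the sign of the declivity, with interval
# techniques absorbing the height measurement error.

def declivity_sign(h_left: float, h_right: float, err: float, dx: float) -> str:
    # Enclose the slope between neighbouring cells in an interval:
    # each height is only known to within +/- err.
    lo = ((h_right - err) - (h_left + err)) / dx
    hi = ((h_right + err) - (h_left - err)) / dx
    if lo > 0:
        return "positive"
    if hi < 0:
        return "negative"
    if lo == 0 and hi == 0:
        return "null"
    return "undefined"   # the interval straddles zero: sign not decidable

heights = [10.0, 10.5, 10.5, 9.8]   # relief samples along one row of cells
row = [declivity_sign(heights[i], heights[i + 1], err=0.05, dx=1.0)
       for i in range(len(heights) - 1)]
print(row)   # ['positive', 'undefined', 'negative']
```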

    Properties of fuzzy implications obtained via the interval constructor

    The extension of classical logic connectives to the unit interval [0, 1] …
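
    One common way to extend a connective on [0, 1] to intervals is the best interval representation: return the tightest interval containing all values of the connective as its arguments range over the input intervals. The sketch below illustrates this for the Łukasiewicz implication; it is a generic illustration of the interval constructor idea, not necessarily the paper's exact construction.

```python
# Best interval representation of a fuzzy implication.  Since the
# Lukasiewicz implication I(x, y) = min(1, 1 - x + y) is decreasing in
# its first argument and increasing in its second, the extreme values
# over two input intervals occur at their endpoints.

from typing import Tuple

Interval = Tuple[float, float]

def luk(x: float, y: float) -> float:
    # Lukasiewicz implication on [0, 1]
    return min(1.0, 1.0 - x + y)

def luk_interval(a: Interval, b: Interval) -> Interval:
    # Tightest enclosing interval: [I(a_hi, b_lo), I(a_lo, b_hi)]
    return (luk(a[1], b[0]), luk(a[0], b[1]))

print(luk_interval((0.6, 0.8), (0.2, 0.4)))   # (0.4, 0.8)
```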

    From Intervals to Domains: Towards a General Description of Validated Uncertainty, with Potential Applications to Geospatial and Meteorological Data

    When physical quantities xi are numbers, the corresponding measurement accuracy can usually be represented in interval terms, and interval computations can be used to estimate the resulting uncertainty in y=f(x1,...,xn). In some practical problems, we are interested in more complex structures such as functions, operators, etc.: we may be interested in how the material strain depends on the applied stress, or in how a physical quantity such as temperature or the velocity of sound depends on a 3-D point. For many such structures, there are ways to represent uncertainty, but usually, for each new structure, we have to perform a lot of complex analysis from scratch. It is desirable to come up with a general methodology that would automatically produce a natural description of validated uncertainty for all physically interesting situations (or at least for as many such situations as possible). In this talk, we describe the foundations for such a methodology; it turns out that this problem naturally leads to the technique of domains, first introduced by D. Scott in the 1970s. In addition to general domain techniques, we also describe applications to geospatial and meteorological data.
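
    For a function-valued quantity, the analogue of an interval is a pair of envelopes bounding the unknown function, with "more information" meaning tighter envelopes, just as intervals are ordered by reverse inclusion. The sketch below is illustrative; the FunctionInterval class is not from the talk.

```python
# Validated uncertainty for a function-valued quantity: enclose the
# unknown function between a lower and an upper envelope, checked here
# at finitely many sample points.

from typing import Callable, Iterable

Envelope = Callable[[float], float]

class FunctionInterval:
    def __init__(self, lower: Envelope, upper: Envelope):
        self.lower, self.upper = lower, upper

    def contains(self, f: Envelope, samples: Iterable[float]) -> bool:
        # Validated check: f stays within the envelopes at the samples.
        return all(self.lower(t) <= f(t) <= self.upper(t) for t in samples)

    def refines(self, other: "FunctionInterval", samples: Iterable[float]) -> bool:
        # Information order: self is a tighter enclosure than other,
        # the function analogue of interval reverse inclusion.
        return all(other.lower(t) <= self.lower(t) and
                   self.upper(t) <= other.upper(t) for t in samples)

# Uncertainty about a temperature profile, known to +/- 1.0 and then +/- 0.5.
coarse = FunctionInterval(lambda t: 19.0, lambda t: 21.0)
fine = FunctionInterval(lambda t: 19.5, lambda t: 20.5)
samples = [0.0, 0.5, 1.0]
print(fine.refines(coarse, samples))            # True
print(fine.contains(lambda t: 20.2, samples))   # True
```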
